Laplace Scheduler for DDPM #11320
Conversation
… (around log SNR = 0), improving performance (lower FID), converging faster, and remaining robust to resolution and training objective. Reference: https://arxiv.org/pdf/2407.03297.
@bot /style
Style fixes have been applied. View the workflow run here.
Force-pushed from 26a4688 to 6376fec.
Of course! I am working on a background-infilling problem for astronomical images. The Laplace scheduler gives me more consistent results across configurations in general. Please take a look at the images below, which contain a significant region that is masked and then infilled with various trained models. All of them are taken from an evaluation batch at epoch 25. (Images: Laplace schedule, cosine schedule, linear schedule.) Empirically, the Laplace scheduler is the most robust and the least likely to blow up without clip sampling. Additionally, for large masks the Laplace scheduler consistently provides the smoothest, most natural infills. I attach the numerical results of these experiments (Results) in case they are helpful. P.S.: @a-r-r-o-w, @yiyixuxu is this okay? I tried pushing it from my Windows machine, which gave me some issues, so I had to force-revert to a previous version of the commit.
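For readers unfamiliar with this kind of masked-infill training setup, here is a hedged sketch of one way it could look, not the commenter's actual code: per-example noise levels are drawn from a Laplace distribution over log SNR, and only the masked region is noised. The tensor names (`images`, `masks`) and the `mu`/`b` defaults are illustrative placeholders, not values from this PR.

```python
# Hedged sketch of Laplace log-SNR sampling in a masked-infill training step.
# Not code from this PR; tensor names and defaults are illustrative only.
import torch


def sample_laplace_log_snr(n: int, mu: float = 0.0, b: float = 0.5) -> torch.Tensor:
    """Inverse-CDF sampling: most draws land near log SNR = mu (mid-range noise)."""
    u = torch.rand(n).clamp(1e-6, 1.0 - 1e-6)
    return mu - b * torch.sign(0.5 - u) * torch.log(1.0 - 2.0 * (u - 0.5).abs())


def noise_masked_region(images: torch.Tensor, masks: torch.Tensor) -> torch.Tensor:
    """Noise only where masks == 1 (the region to infill); keep the background clean."""
    log_snr = sample_laplace_log_snr(images.shape[0])
    alpha = torch.sigmoid(log_snr).sqrt().view(-1, 1, 1, 1)
    sigma = torch.sigmoid(-log_snr).sqrt().view(-1, 1, 1, 1)
    noisy = alpha * images + sigma * torch.randn_like(images)
    return masks * noisy + (1.0 - masks) * images
```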
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
ping?
Hi all, just following up here! I've reset the PR to its clean state (Laplace scheduler only), provided generation results, and included comparisons showing consistent improvements in stability and quality across configurations. Please let me know if there's anything else I should address or tweak; happy to follow up. I would love to see this scheduler added, especially given its robustness across loss types and masking setups. I've personally found more success using it for training inpainting models than with other schedules, and it would be convenient to have it built in instead of re-implementing it every time I update diffusers.
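For anyone in the same position before this lands, one possible stopgap (a sketch under the assumption that a Laplace-style cumulative-alpha curve is acceptable for your model; the `mu`/`b` defaults are illustrative, not values from this PR) is to precompute the betas once and pass them to the existing `trained_betas` argument of `DDPMScheduler`:

```python
# Hedged sketch: derive betas from a Laplace log-SNR schedule and reuse the
# existing DDPMScheduler via its `trained_betas` argument. Defaults are
# illustrative, not the values proposed in this PR.
import torch
from diffusers import DDPMScheduler


def laplace_betas(num_steps: int = 1000, mu: float = 0.0, b: float = 0.5) -> list:
    t = torch.linspace(0.0, 1.0, num_steps).clamp(1e-5, 1.0 - 1e-5)
    log_snr = mu - b * torch.sign(0.5 - t) * torch.log(1.0 - 2.0 * (t - 0.5).abs())
    alphas_cumprod = torch.sigmoid(log_snr)                # alpha_bar_t, decreasing in t
    alphas = alphas_cumprod.clone()
    alphas[1:] = alphas_cumprod[1:] / alphas_cumprod[:-1]  # per-step alpha_t
    betas = (1.0 - alphas).clamp(1e-8, 0.999)
    return betas.tolist()


scheduler = DDPMScheduler(num_train_timesteps=1000, trained_betas=laplace_betas(1000))
```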
gentle ping @yiyixuxu @sayakpaul @DN6 |









What does this PR do?
This PR adds a Laplace noise scheduler that samples more frequently around mid-range noise levels (centered around log SNR = 0), following the formulation described in Improved Noise Schedule For Diffusion Training; a short sketch of the mapping appears after the list below.
Improvements include:
- Better sample quality (lower FID).
- Faster convergence during training.
- Robustness to resolution and training objective.
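For readers skimming the thread, here is a minimal sketch of the kind of mapping the referenced paper describes, not the code in this PR: log SNR values are obtained through the inverse CDF of a Laplace distribution with location `mu` and scale `b` (illustrative defaults below), so sampled noise levels concentrate around log SNR = mu.

```python
# Minimal sketch (assumed formulation from arXiv:2407.03297, not this PR's code):
# map t in (0, 1) to a log-SNR value via the inverse Laplace CDF, then convert
# log SNR to the (alpha_t, sigma_t) pair of a variance-preserving process.
import torch


def laplace_log_snr(t: torch.Tensor, mu: float = 0.0, b: float = 0.5) -> torch.Tensor:
    """Inverse Laplace CDF: t near 0.5 lands near log SNR = mu, so mid-range
    noise levels are visited more often than the extremes."""
    t = t.clamp(1e-6, 1.0 - 1e-6)
    return mu - b * torch.sign(0.5 - t) * torch.log(1.0 - 2.0 * (t - 0.5).abs())


def alpha_sigma(log_snr: torch.Tensor):
    """alpha_t^2 = sigmoid(log SNR), sigma_t^2 = sigmoid(-log SNR)."""
    return torch.sigmoid(log_snr).sqrt(), torch.sigmoid(-log_snr).sqrt()


# Example: noise levels for 1000 evenly spaced timesteps.
t = torch.linspace(0.0, 1.0, 1000)
alphas, sigmas = alpha_sigma(laplace_log_snr(t))
```

A smaller `b` concentrates sampling more tightly around `mu`; the defaults above are placeholders rather than the values chosen in this PR.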
Before submitting
- Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Core library: